I'm developing a drawing app that uses MTKView to render the canvas. For some reason, and only for a few users, pixels are not rendered correctly: they come out at different sizes. The majority of users see no problem. Here is my setup:
Each pixel is rendered as two triangles.
The MTKView's frame dimensions are always a multiple of the canvas size (a 100x100 canvas will have a frame of 100x100, 200x200, and so on).
There is a grid to indicate pixels (a SwiftUI Path) which displays correctly, and you can see that it doesn't align with the pixels.
There is also a checkerboard pattern in the background, rendered with another MTKView, which lines up with the pixels but not with the grid.
Previously, I had a similar issue when the view's frame was not a multiple of the canvas size, but the setup above already fixes that.
The issue worsens as the number of points representing one canvas pixel gets smaller, e.g. a 100x100 canvas on a 100x100 view is worse than a 100x100 canvas on a 500x500 view.
The vertices have accurate coordinates, so this is a rendering issue. As you can see in the picture, some pixels are bigger than others.
I tried changing the contentScaleFactor to 1, 2, and 3, but none of them solves the problem.
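For context, my understanding is that the drawable is sized in device pixels, roughly bounds × contentScaleFactor, so the mapping can become uneven whenever that product isn't an exact multiple of the canvas. A minimal sketch of pinning the drawable size explicitly (canvasSize, pixelScale, and mtkView are placeholder names, not my actual code):

import MetalKit

// Sketch: size the drawable to an exact integer multiple of the canvas so
// every canvas pixel maps to the same number of device pixels.
let mtkView = MTKView()
let canvasSize = CGSize(width: 100, height: 100)
let pixelScale: CGFloat = 4  // device pixels per canvas pixel

mtkView.autoResizeDrawable = false  // keep MTKView from re-deriving the size
mtkView.drawableSize = CGSize(width: canvasSize.width * pixelScale,
                              height: canvasSize.height * pixelScale)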
My MTKView setup:
clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
delegate = renderer
renderer.setup()
isOpaque = false
layer.magnificationFilter = .nearest
layer.minificationFilter = .nearest
Renderer's setup:
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.vertexFunction = vertexFunction
pipelineDescriptor.fragmentFunction = fragmentFunction
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineState = try? device.makeRenderPipelineState(descriptor: pipelineDescriptor)
Draw method of renderer:
commandEncoder.setRenderPipelineState(pipelineState)
commandEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
commandEncoder.setVertexBuffer(colorBuffer, offset: 0, index: 1)
commandEncoder.drawIndexedPrimitives(
    type: .triangle,
    indexCount: indexCount,
    indexType: .uint32,
    indexBuffer: indexBuffer,
    indexBufferOffset: 0
)
commandEncoder.endEncoding()
commandBuffer.present(drawable)
commandBuffer.commit()
Metal file:
struct VertexOut {
    float4 position [[ position ]];
    half4 color;
};

vertex VertexOut frame_vertex(constant const float2* vertices [[ buffer(0) ]],
                              constant const half4* colors [[ buffer(1) ]],
                              uint v_id [[ vertex_id ]]) {
    VertexOut out;
    out.position = float4(vertices[v_id], 0, 1);
    out.color = colors[v_id / 4];
    return out;
}

fragment half4 frame_fragment(VertexOut v [[ stage_in ]]) {
    half alpha = v.color.a;
    return half4(v.color.r * alpha, v.color.g * alpha, v.color.b * alpha, alpha);
}
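The vertex buffer holds four corners per canvas pixel (hence colors[v_id / 4] above), and the index buffer forms the two triangles from each group of four. A simplified sketch of the NDC mapping, with illustrative names rather than my exact code:

import simd

// Illustrative only: four NDC corners for canvas cell (x, y), y = 0 at the top.
// The index buffer then builds two triangles from each group of four corners.
func quadVertices(x: Int, y: Int, canvasWidth: Int, canvasHeight: Int) -> [SIMD2<Float>] {
    let x0 = Float(x)     / Float(canvasWidth)  * 2 - 1
    let x1 = Float(x + 1) / Float(canvasWidth)  * 2 - 1
    let y0 = 1 - Float(y + 1) / Float(canvasHeight) * 2
    let y1 = 1 - Float(y)     / Float(canvasHeight) * 2
    return [SIMD2(x0, y0), SIMD2(x1, y0), SIMD2(x1, y1), SIMD2(x0, y1)]
}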
I'm using MTKView to render some triangles. Everything works fine until I decrease the color's saturation and opacity, at which point the triangle becomes completely transparent. Creating a SwiftUI Color with the same values displays correctly. This only happens for colors with "low" saturation; if the color has 100% saturation (like #FF0000), it still renders fine even at just 1% opacity.
I noticed that if I change the colorPixelFormat of the MTKView, the result changes. So I'm not sure whether I only need to change the colorPixelFormat to fix this, and if so, I don't know which one to use, as I have limited knowledge about graphics. Here is an example for the color #FF8888:
bgra8Unorm: needs at least 55% opacity for it to render
bgra8Unorm_srgb: needs at least 77% opacity for it to render, and the color is much lighter than it should be
In Swift, I store the colors as [Float]; in MSL, that buffer is read as float4*. Nothing fancy in the vertex and fragment functions, they just return the input, so the issue is unlikely to be there, since other colors work.
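If I understand correctly, bgra8Unorm_srgb treats fragment output as linear and sRGB-encodes it on store, which would explain the lighter color if my values are already sRGB. A standalone sketch of the standard transfer function to illustrate (this is not code from my app):

import Foundation

// Standard sRGB encode: what an _srgb attachment applies to linear output on store.
// Values that are already sRGB get encoded twice and come out lighter.
func srgbEncode(_ linear: Float) -> Float {
    linear <= 0.0031308 ? linear * 12.92 : 1.055 * pow(linear, 1 / 2.4) - 0.055
}

// Example: 0x88 / 255 ≈ 0.533 re-encoded becomes ≈ 0.76, noticeably lighter.
print(srgbEncode(0.533))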
Some code to show my setup:
// MTKView's setup
clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
isOpaque = false
layer.magnificationFilter = .nearest
layer.minificationFilter = .nearest
// State setup
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.vertexFunction = vertexFunction
pipelineDescriptor.fragmentFunction = fragmentFunction
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineState = try? device.makeRenderPipelineState(descriptor: pipelineDescriptor)
// draw method setup
guard let vertexBuffer = vertexBuffer,
      let indexBuffer = indexBuffer,
      let indexCount = indexCount,
      let colorBuffer = colorBuffer,
      let pipelineState = pipelineState,
      let descriptor = view.currentRenderPassDescriptor,
      let commandBuffer = commandQueue.makeCommandBuffer(),
      let commandEncoder = commandBuffer.makeRenderCommandEncoder(
          descriptor: descriptor
      ),
      let drawable = view.currentDrawable else {
    return
}
commandEncoder.setRenderPipelineState(pipelineState)
commandEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
commandEncoder.setVertexBuffer(colorBuffer, offset: 0, index: 1)
commandEncoder.drawIndexedPrimitives(
    type: .triangle,
    indexCount: indexCount,
    indexType: .uint32,
    indexBuffer: indexBuffer,
    indexBufferOffset: 0
)
commandEncoder.endEncoding()
commandBuffer.present(drawable)
commandBuffer.commit()
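One detail I'm unsure about: the pipeline descriptor hardcodes .bgra8Unorm, so it can disagree with the view whenever I change colorPixelFormat. A one-line sketch of keeping them in sync (view here is the MTKView):

// Keep the pipeline's attachment format in sync with the view's drawable format.
pipelineDescriptor.colorAttachments[0].pixelFormat = view.colorPixelFormat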
Hi all,
I'm using SwiftUI 2 and I need to receive a callback in my app:
didRegisterForRemoteNotificationsWithDeviceToken
I'm putting this in my @main App struct:
@UIApplicationDelegateAdaptor(AppDelegate.self) private var appDelegate
The problem is that only willFinishLaunchingWithOptions and didFinishLaunchingWithOptions get called; the rest don't get called at all. I'm fine with the others, as there are modifiers to replace them, but there is nothing for the APNs token callback.
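A minimal sketch of the setup (simplified; MyApp and the print are placeholders, and it assumes registerForRemoteNotifications() is called at launch):

import SwiftUI
import UIKit

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        application.registerForRemoteNotifications()  // this delegate method does get called
        return true
    }

    // The callback that never fires for me.
    func application(_ application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
        print("APNs token:", deviceToken.map { String(format: "%02x", $0) }.joined())
    }
}

@main
struct MyApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) private var appDelegate

    var body: some Scene {
        WindowGroup { Text("Hello") }
    }
}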
I'm stuck here.
Any help is much appreciated. Thanks